Linear Regression and Gradient Descent Method for Electricity Output Power Prediction

Authors

Abstract


Similar Articles

Semi-supervised structured output prediction by local linear regression and sub-gradient descent

In this paper, we propose a novel semi-supervised structured output prediction method based on local linear regression. Existing semi-supervised structured output prediction methods learn a single global predictor for all the data points in a data set, which ignores the differences among local distributions of the data set and their effects on structured output prediction. To solve this problem, we ...


Exponentiated Gradient Versus Gradient Descent for Linear Predictors

We consider two algorithms for on-line prediction based on a linear model. The algorithms are the well-known gradient descent (GD) algorithm and a new algorithm, which we call EG. They both maintain a weight vector using simple updates. For the GD algorithm, the update is based on subtracting the gradient of the squared error made on a prediction. The EG algorithm uses the components of the gra...
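The two update rules contrasted above can be sketched in a few lines. The snippet below is a minimal illustration, assuming squared loss and a fixed learning rate `eta` (both hypothetical choices for illustration; the paper's exact parameterization is not reproduced here): GD subtracts the gradient of the squared prediction error, while EG scales each weight by the exponential of the corresponding negative gradient component and renormalizes.

```python
import numpy as np

def gd_update(w, x, y, eta=0.1):
    """One gradient-descent step on the squared error of a linear prediction.

    Subtracts the gradient of (w.x - y)^2 with respect to w.
    eta is an illustrative learning rate, not a value from the paper.
    """
    y_hat = w @ x
    return w - eta * 2.0 * (y_hat - y) * x

def eg_update(w, x, y, eta=0.1):
    """One exponentiated-gradient step.

    Each weight is multiplied by exp(-eta * gradient component), then the
    vector is renormalized so the weights remain a probability distribution.
    """
    y_hat = w @ x
    grad = 2.0 * (y_hat - y) * x
    w_new = w * np.exp(-eta * grad)
    return w_new / w_new.sum()
```

Note the structural difference: GD's update is additive, so weights can take any sign, whereas EG's multiplicative update keeps the weights positive and summing to one.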


Privacy-preservation for Stochastic Gradient Descent Method

The traditional paradigm in machine learning has been that, given a data set, the goal is to learn a target function or decision model (such as a classifier) from it. Many techniques in data mining and machine learning follow a gradient descent paradigm in the iterative process of discovering this target function or decision model. For instance, linear regression can be resolved through a gradie...
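As a concrete instance of the iterative paradigm described above, the sketch below fits a linear regression by batch gradient descent on the mean squared error. The learning rate `eta` and step count `n_steps` are illustrative assumptions, not values from any of the papers listed here.

```python
import numpy as np

def fit_linear_regression(X, y, eta=0.1, n_steps=500):
    """Fit a linear model y ~ X @ w by batch gradient descent on the MSE.

    eta and n_steps are illustrative hyperparameters; in practice they
    would be tuned for the data at hand.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_steps):
        residual = X @ w - y              # current prediction errors
        grad = (2.0 / n) * (X.T @ residual)  # gradient of mean squared error
        w -= eta * grad                   # descend along the negative gradient
    return w
```

On well-conditioned data this converges to the least-squares solution; the same loop structure underlies stochastic variants, which replace the full-batch gradient with an estimate from one sample or a mini-batch.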


A Gradient Descent Method for a Neural

It has been demonstrated that higher-order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism to encode visual memories in a neural substrate, since even a simple twelve-weight network can encode a very large set of different images. The main problem in this memory model, which so far has r...


Worst-case quadratic loss bounds for prediction using linear functions and gradient descent

This paper studies the performance of gradient descent (GD) when applied to the problem of online linear prediction in arbitrary inner product spaces. We prove worst-case bounds on the sum of the squared prediction errors under various assumptions concerning the amount of a priori information about the sequence to predict. The algorithms we use are variants and extensions of online GD. Whereas our algorit...



Journal

Journal title: Journal of Computer and Communications

Year: 2019

ISSN: 2327-5219,2327-5227

DOI: 10.4236/jcc.2019.712004